Minimax Optimal Alternating Minimization for Kernel Nonparametric Tensor Learning

Suzuki, Taiji, Kanagawa, Heishiro, Kobayashi, Hayato, Shimizu, Nobuyuki, Tagami, Yukihiro

Neural Information Processing Systems

We investigate the statistical performance and computational efficiency of the alternating minimization procedure for nonparametric tensor learning. Tensor modeling has been widely used to capture higher-order relations between multimodal data sources. Beyond the linear model, the nonlinear tensor model has received much attention recently because of its high flexibility. We consider an alternating minimization procedure for a general nonlinear model in which the true function consists of components in a reproducing kernel Hilbert space (RKHS). In this paper, we show that the alternating minimization method achieves linear convergence as an optimization algorithm and that the generalization error of the resulting estimator attains the minimax optimal rate. We apply our algorithm to several multitask learning problems and show that the method performs favorably.
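The alternating minimization scheme in the abstract can be illustrated on the simplest case, a rank-1 model f(a, b) = f1(a) f2(b): with one RKHS component fixed, fitting the other reduces to a weighted kernel ridge regression, and the two updates are alternated. The following is a minimal sketch, not the authors' implementation; the Gaussian kernel, bandwidth, regularization strength, and initialization are all illustrative assumptions.

```python
import numpy as np

# Synthetic data from a rank-1 separable target plus noise.
rng = np.random.default_rng(0)
n = 100
a = rng.uniform(-1, 1, n)
b = rng.uniform(-1, 1, n)
y = np.sin(np.pi * a) * np.cos(np.pi * b) + 0.05 * rng.standard_normal(n)

def gauss_kernel(u, v, gamma=5.0):
    # Gaussian (RBF) kernel matrix between 1-D sample vectors u and v.
    return np.exp(-gamma * (u[:, None] - v[None, :]) ** 2)

Ka, Kb = gauss_kernel(a, a), gauss_kernel(b, b)

def ridge_step(K, c, y, lam=1e-3):
    # With the other component fixed at values c_i, the model is linear in
    # this component; the representer theorem gives coefficients alpha solving
    # (diag(c^2) K + lam I) alpha = c * y  (a weighted kernel ridge problem).
    return np.linalg.solve((c ** 2)[:, None] * K + lam * np.eye(len(y)), c * y)

alpha = np.zeros(n)            # coefficients of f1 in the RKHS
beta = np.full(n, 1.0 / n)     # coefficients of f2 (nonzero initialization)
losses = []
for _ in range(20):
    f2_vals = Kb @ beta
    alpha = ridge_step(Ka, f2_vals, y)   # update f1 with f2 fixed
    f1_vals = Ka @ alpha
    beta = ridge_step(Kb, f1_vals, y)    # update f2 with f1 fixed
    losses.append(np.mean((y - (Ka @ alpha) * (Kb @ beta)) ** 2))
```

Each update is a closed-form linear solve, so one sweep costs two n-by-n system solves; the paper's analysis concerns how fast such sweeps converge and the statistical rate of the resulting estimator.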



Reviews: Minimax Optimal Alternating Minimization for Kernel Nonparametric Tensor Learning

Neural Information Processing Systems

This paper presents a new nonparametric tensor regression method based on kernels. More specifically, the authors propose a regularization-based optimization approach with alternating minimization for the nonparametric tensor regression model [15]. Moreover, a theoretical guarantee for the proposed method is presented in the paper. Through experiments on various datasets, the proposed method compares favorably with the existing state of the art. The paper is clearly written and easy to read. I understand the key contribution of this paper to be the theoretical analysis of nonparametric tensor regression.

